A characteristic condition for convergence of steepest descent approximation to accretive operator equations
Authors
Abstract
Similar Articles
Convergence theorems of iterative approximation for finding zeros of accretive operator and fixed points problems
In this paper we propose and study a new composite iterative scheme, with certain control conditions, for viscosity approximation of a zero of an accretive operator and for fixed point problems in a reflexive Banach space with a weakly continuous duality mapping. Strong convergence of the sequence {x_n} defined by the newly introduced iterative scheme is proved. The main results improve and complement the co...
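The truncated abstract does not spell out the composite scheme itself, so the sketch below is only a generic viscosity-type iteration for a zero of a monotone operator in the Hilbert-space (hence accretive) setting, built from its resolvent and a contraction; the operator `A`, the contraction `f`, and the parameter choices are illustrative assumptions, not the scheme of the paper.

```python
# Hedged sketch: generic viscosity-type iteration
#   x_{n+1} = a_n * f(x_n) + (1 - a_n) * J_r(x_n)
# for a zero of a monotone operator A on R^d (the Hilbert-space case of accretivity),
# with resolvent J_r = (I + r A)^{-1} and a contraction f.
# A, f, r and the sequence a_n are illustrative assumptions, not the paper's scheme.
import numpy as np

d = 5
rng = np.random.default_rng(0)
M = rng.standard_normal((d, d))
A = M @ M.T + np.eye(d)                  # positive definite => x -> A x is monotone, zero at x = 0
r = 1.0
J_r = np.linalg.inv(np.eye(d) + r * A)   # resolvent of A (a plain matrix here)

anchor = rng.standard_normal(d)
f = lambda x: 0.5 * x + 0.5 * anchor     # contraction with constant 1/2

x = rng.standard_normal(d)
for n in range(1, 2001):
    a_n = 1.0 / (n + 1)                  # a_n -> 0 and sum a_n = infinity
    x = a_n * f(x) + (1.0 - a_n) * (J_r @ x)

# x approaches the unique zero of A (the origin), roughly at the O(a_n) rate
# typical of viscosity-type schemes.
print("distance to the zero of A:", np.linalg.norm(x))
```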
A Geometric Convergence Theory for the Preconditioned Steepest Descent Iteration
Preconditioned gradient iterations for very large eigenvalue problems are efficient solvers with growing popularity. However, only for the simplest preconditioned eigensolver, namely the preconditioned gradient iteration (or preconditioned inverse iteration) with fixed step size, sharp non-asymptotic convergence estimates are known. These estimates require a properly scaled preconditioner. In t...
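For concreteness, here is a minimal NumPy sketch of the fixed-step preconditioned gradient iteration (PINVIT) mentioned above, applied to the smallest eigenpair of a symmetric positive definite matrix; the test matrix, the Jacobi (diagonal) preconditioner, and the tolerance are assumptions made for this example.

```python
# Hedged sketch: preconditioned gradient iteration (PINVIT) with fixed step size
# for the smallest eigenpair of a symmetric positive definite matrix A.
# The test matrix, the Jacobi (diagonal) preconditioner T, and the tolerance are
# illustrative assumptions, not taken from the cited paper.
import numpy as np

n = 200
A = np.diag(np.arange(1.0, n + 1.0))          # dominant diagonal 1..n ...
A += np.diag(-0.3 * np.ones(n - 1), 1) + np.diag(-0.3 * np.ones(n - 1), -1)  # ... plus weak coupling
T = np.diag(1.0 / np.diag(A))                 # Jacobi preconditioner, a rough approximation of A^{-1}

rng = np.random.default_rng(1)
x = rng.standard_normal(n)
x /= np.linalg.norm(x)

for _ in range(500):
    rho = x @ A @ x                           # Rayleigh quotient (x is kept unit-norm)
    res = A @ x - rho * x                     # eigenvalue residual
    if np.linalg.norm(res) < 1e-8:
        break
    x = x - T @ res                           # preconditioned gradient step, fixed step size 1
    x /= np.linalg.norm(x)

# With this well-scaled preconditioner the Rayleigh quotient decreases and, from a
# generic start, settles near the smallest eigenvalue of A (about 1 for this matrix).
print("approximate smallest eigenvalue:", x @ A @ x)
```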
Doubly Degenerate Diffusion Equations as Steepest Descent
For $p \in (1,\infty)$ and $n > 0$ we consider the scalar doubly degenerate diffusion equation $\partial_t s - \operatorname{div}\!\left(|\nabla s^n|^{p-2}\,\nabla s^n\right) = 0$ (1) with no-flux boundary conditions. We argue that this evolution problem can be understood as steepest descent of the convex functional $\operatorname{sign}(m-1)\int s^m$, provided $m := n + \frac{p-2}{p-1} > 0$ (2), with respect to the Wasserstein metric of order $p$ on the space of probability d...
Residual norm steepest descent based iterative algorithms for Sylvester tensor equations
Consider the following consistent Sylvester tensor equation $\mathscr{X}\times_1 A + \mathscr{X}\times_2 B + \mathscr{X}\times_3 C = \mathscr{D}$, where the matrices $A, B, C$ and the tensor $\mathscr{D}$ are given and $\mathscr{X}$ is the unknown tensor. The current paper is concerned with examining a simple and neat framework for accelerating the convergence of the gradient-based iterative algorithm and ...
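To make the gradient-based iteration concrete, here is a minimal NumPy sketch of residual-norm steepest descent for this tensor equation, with mode-n products written via einsum; the random, diagonally shifted test matrices and the exact line search are assumptions for the example and do not reproduce the accelerated framework of the paper.

```python
# Hedged sketch: residual-norm steepest descent for the Sylvester tensor equation
#   X x_1 A + X x_2 B + X x_3 C = D
# with mode-n products written via einsum. The test data and the exact line search
# are illustrative; the accelerated framework of the cited paper is not reproduced.
import numpy as np

def sylvester_op(X, A, B, C):
    """Linear map X -> X x_1 A + X x_2 B + X x_3 C for a third-order tensor X."""
    return (np.einsum('ia,ajk->ijk', A, X)
            + np.einsum('jb,ibk->ijk', B, X)
            + np.einsum('kc,ijc->ijk', C, X))

def sylvester_adjoint(R, A, B, C):
    """Adjoint of the map above (same map with transposed matrices)."""
    return sylvester_op(R, A.T, B.T, C.T)

rng = np.random.default_rng(0)
I, J, K = 4, 5, 6
# Diagonally shifted random matrices keep the Sylvester operator well conditioned.
A = 5.0 * np.eye(I) + 0.5 * rng.standard_normal((I, I))
B = 5.0 * np.eye(J) + 0.5 * rng.standard_normal((J, J))
C = 5.0 * np.eye(K) + 0.5 * rng.standard_normal((K, K))
X_true = rng.standard_normal((I, J, K))
D = sylvester_op(X_true, A, B, C)               # consistent right-hand side

X = np.zeros_like(D)
for _ in range(5000):
    R = D - sylvester_op(X, A, B, C)            # residual tensor
    if np.linalg.norm(R) < 1e-10:
        break
    G = sylvester_adjoint(R, A, B, C)           # steepest-descent direction for 0.5*||R||^2
    LG = sylvester_op(G, A, B, C)
    alpha = np.vdot(LG, R) / np.vdot(LG, LG)    # step length minimizing the residual norm
    X = X + alpha * G

print("relative error:", np.linalg.norm(X - X_true) / np.linalg.norm(X_true))
```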
Global Convergence of Steepest Descent for Quadratic Functions
This paper analyzes the effect of momentum on steepest descent training for quadratic performance functions. Some global convergence conditions for the steepest descent algorithm are obtained by directly analyzing the exact momentum equations for quadratic cost functions. These conditions can be derived directly from the algorithm's parameters (rather than from the eigenvalues used in existing results) ...
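The iteration in question is easy to reproduce; below is a small NumPy sketch of steepest descent with a momentum (heavy-ball) term on a quadratic cost. The matrix, step size, and momentum coefficient are illustrative choices satisfying the classical stability requirement, not the parameter-based conditions derived in the paper.

```python
# Hedged sketch: steepest descent with momentum on a quadratic cost
#   f(x) = 0.5 * x^T A x - b^T x,   grad f(x) = A x - b.
# Update: x_{k+1} = x_k - alpha * grad f(x_k) + beta * (x_k - x_{k-1}).
# A, alpha, and beta are illustrative choices satisfying the classical heavy-ball
# stability requirement 0 <= beta < 1 and 0 < alpha < 2*(1 + beta)/lambda_max,
# not the parameter-based conditions derived in the cited paper.
import numpy as np

rng = np.random.default_rng(0)
n = 10
Q = rng.standard_normal((n, n))
A = Q @ Q.T + np.eye(n)            # symmetric positive definite quadratic
b = rng.standard_normal(n)
x_star = np.linalg.solve(A, b)     # exact minimizer, used only to check the error

lam_max = np.linalg.eigvalsh(A)[-1]
alpha = 1.0 / lam_max              # step size
beta = 0.5                         # momentum coefficient

x_prev = np.zeros(n)
x = np.zeros(n)
for _ in range(2000):
    grad = A @ x - b
    x, x_prev = x - alpha * grad + beta * (x - x_prev), x

print("error ||x - x*|| =", np.linalg.norm(x - x_star))
```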
Journal
Journal title: Journal of Mathematical Analysis and Applications
Year: 2002
ISSN: 0022-247X
DOI: 10.1016/s0022-247x(02)00122-1